Company
Location
Porto - Portugal
Job type
Full-Time
Python Job Details
Porto, Portugal - Remote
UTC +00:00
More details
Permanent
Data Scientist
5 - 10+ years of experience - Senior
Visa support
Relocation paid
Language(s): English (required)
SKILLS
Must have:
Apache Spark
AWS Redshift
Python
Other Required:
Big Data
Data Warehousing
Data Lake
Amazon Web Services
Apache Kafka
PostgreSQL
Nice to have:
SQL
Linux
DESCRIPTION
dxFeed is a leading provider of data services for the capital markets industry. The company sources and stores direct market data feeds from a variety of exchanges and market participants around the world. dxFeed has built one of the most comprehensive ticker plants and offers a broad range of data services for streaming, consolidation, storage, extraction, and analytics, including index construction and maintenance for buy-side and sell-side institutions across the global financial industry.
We are looking for an experienced Data Engineer to help us make better products for our customers. You will be joining a team of professionals with extensive experience in creating financial software products.
Job Description
Responsibilities:
Contribute to the development of our in-house cross-functional data storage platform
Establish a data infrastructure strategy to capture and harness new data assets for new products and technology practices
Collaborate with decision makers, project managers, data scientists, and other stakeholders
Build data pipelines using Apache Airflow, Redshift, Spark, Kafka, and more
Connect new data sources
Maintain data infrastructure (resource management, upgrades)
Monitor data pipelines and improve their performance
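To give candidates a feel for the pipeline work described above, here is a minimal, hypothetical sketch (not dxFeed's actual stack) of how dependent pipeline tasks are ordered and run. It uses only Python's stdlib `graphlib` in place of Apache Airflow, and the task names and stub functions are invented for illustration:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Stub tasks; a real pipeline here would use Airflow operators
# backed by Redshift, Spark, and Kafka instead of these functions.
def extract_market_data():
    return ["tick1", "tick2"]

def transform(ticks):
    return [t.upper() for t in ticks]

def load(rows):
    return len(rows)  # pretend "rows loaded" count

# Task dependency graph: each key runs after the tasks it maps to,
# much like Airflow's DAG edges.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

def run_pipeline():
    results = {}
    for task in TopologicalSorter(deps).static_order():
        if task == "extract":
            results[task] = extract_market_data()
        elif task == "transform":
            results[task] = transform(results["extract"])
        elif task == "load":
            results[task] = load(results["transform"])
    return results

results = run_pipeline()
```

In Airflow the same structure would be declared as a DAG and scheduled, monitored, and retried by the framework rather than run inline.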
REQUIREMENTS
Qualifications
You are proficient in writing SQL on analytics (OLAP) databases (preferably Redshift, ClickHouse)
You are proficient in writing SQL on transactional (OLTP) databases (preferably PostgreSQL)
You are experienced in transforming data model requirements into Big Data solutions
You have worked on efficient cloud-based data warehouse/data lake operation
You are experienced in task automation, preferably in Python
You have experience with deployment and support of cloud-based solutions, preferably AWS + Terraform
You are able to work in a self-organizing environment
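As a rough illustration of the OLTP SQL skills listed above, here is a minimal sketch using Python's stdlib `sqlite3` in place of PostgreSQL; the table, columns, and data are invented for the example, but the aggregation SQL is standard:

```python
import sqlite3

# In-memory SQLite stands in for PostgreSQL here; the SQL itself
# carries over. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE trades (
        symbol TEXT    NOT NULL,
        price  REAL    NOT NULL,
        qty    INTEGER NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO trades (symbol, price, qty) VALUES (?, ?, ?)",
    [("AAPL", 190.0, 10), ("AAPL", 191.0, 5), ("MSFT", 410.0, 3)],
)

# A typical analytics-style aggregation: volume-weighted average price
# and total volume per symbol.
rows = conn.execute("""
    SELECT symbol,
           SUM(price * qty) / SUM(qty) AS vwap,
           SUM(qty)                    AS volume
    FROM trades
    GROUP BY symbol
    ORDER BY symbol
""").fetchall()
```

Against Redshift or ClickHouse the same query shape would run over columnar storage at far larger scale, which is where the OLAP-specific tuning experience comes in.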
PERKS
We offer:
Paid vacation leave (22 working days);
Insurance coverage (for you and your children);
Food vouchers of 9.5 EUR per working day;
Discounted sports membership;
Snacks and beverages in the office;
Modern, well-equipped office;
Flexible schedule;
Work from home opportunity;
Portuguese language courses.
REMOTE DETAILS
To apply for this job, you must be willing to work in the UTC +00:00 time zone.
100% remote FROM/INSIDE Portugal